
    Hybrid ACO and SVM algorithm for pattern classification

    Ant Colony Optimization (ACO) is a metaheuristic algorithm that can be used to solve a variety of combinatorial optimization problems. A new direction for ACO is to optimize continuous and mixed (discrete and continuous) variables. Support Vector Machine (SVM) is a pattern classification approach that originated from statistical learning theory. However, SVM suffers from two main problems: feature subset selection and parameter tuning. Most approaches to tuning SVM parameters discretize the continuous values of the parameters, which has a negative effect on classification performance. This study presents four algorithms that tune the SVM parameters and select the feature subset, improving SVM classification accuracy with a smaller feature subset. This is achieved by performing the SVM parameter tuning and feature subset selection processes simultaneously. Hybrid algorithms combining ACO and SVM techniques were proposed. The first two algorithms, ACOR-SVM and IACOR-SVM, tune the SVM parameters, while the latter two, ACOMV-R-SVM and IACOMV-R-SVM, tune the SVM parameters and select the feature subset simultaneously. Ten benchmark datasets from the University of California, Irvine (UCI) repository were used in the experiments to validate the performance of the proposed algorithms. The experimental results obtained from the proposed algorithms are better than those of other approaches in terms of classification accuracy and feature subset size. The average classification accuracies for the ACOR-SVM, IACOR-SVM, ACOMV-R-SVM and IACOMV-R-SVM algorithms are 94.73%, 95.86%, 97.37% and 98.1%, respectively. The average feature subset size is eight for the ACOR-SVM and IACOR-SVM algorithms and four for the ACOMV-R-SVM and IACOMV-R-SVM algorithms. This study contributes a new direction for ACO that deals with continuous and mixed-variable problems.
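
    The shared ingredient of these hybrids is a fitness function that scores one candidate solution (continuous SVM parameters plus a binary feature mask) by cross-validated accuracy, penalised by subset size. The sketch below is a minimal illustration of that idea using scikit-learn; the dataset, the RBF kernel, and the 0.9/0.1 weighting are assumptions for illustration, not the paper's exact objective.

```python
# Minimal sketch of a fitness evaluation for a mixed-variable ACO-SVM candidate.
# Assumed pieces (not from the paper): breast-cancer dataset, RBF kernel, 0.9/0.1 weights.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)   # any UCI-style dataset works here

def fitness(C, gamma, feature_mask):
    """Higher is better: accuracy rewarded, large feature subsets penalised."""
    selected = np.flatnonzero(feature_mask)
    if selected.size == 0:                    # empty subset is infeasible
        return 0.0
    clf = SVC(C=C, gamma=gamma, kernel="rbf")
    acc = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    return 0.9 * acc + 0.1 * (1.0 - selected.size / X.shape[1])

# Example candidate: tuned parameters plus a random feature mask.
mask = np.random.default_rng(0).random(X.shape[1]) < 0.5
print(fitness(C=10.0, gamma=0.01, feature_mask=mask))
```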

    Solving SVM model selection problem using ACOR and IACOR

    Ant Colony Optimization (ACO) has been used to solve the Support Vector Machine (SVM) model selection problem. ACO originally deals with discrete optimization problems. When applying ACO to optimize SVM parameters, which are continuous variables, the continuous values must be discretized. This discretization process results in loss of information and hence affects the classification accuracy. In order to enhance SVM performance and avoid the discretization problem, this study proposes two algorithms that optimize the SVM parameters using Continuous ACO (ACOR) and Incremental Continuous Ant Colony Optimization (IACOR) without the need to discretize the continuous SVM parameter values. Eight datasets from UCI were used to evaluate the credibility of the proposed integrated algorithms in terms of classification accuracy and feature subset size. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. The results also show that IACOR-SVM is better than ACOR-SVM in terms of classification accuracy.
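
    The step that lets ACOR avoid discretization is its sampling rule: each ant picks a solution from a ranked archive and draws each continuous parameter from a Gaussian centred on that solution. The following is a minimal sketch of that step for a (C, gamma) pair; the archive size and the q and xi constants are common ACOR defaults assumed here, not values reported in the abstract.

```python
# Sketch of the ACOR sampling step for proposing new (C, gamma) pairs directly
# in continuous space. Constants k, q, xi are assumed defaults, not paper values.
import numpy as np

rng = np.random.default_rng(0)
k, q, xi = 10, 0.1, 0.85                        # archive size, locality, spread factor
lower, upper = np.array([0.1, 1e-4]), np.array([100.0, 1.0])

# Archive rows hold [C, gamma]; assume it is kept sorted by fitness (best first).
archive = rng.uniform(lower, upper, size=(k, 2))

def sample_solution(archive):
    ranks = np.arange(1, k + 1)
    w = np.exp(-(ranks - 1) ** 2 / (2 * (q * k) ** 2)) / (q * k * np.sqrt(2 * np.pi))
    j = rng.choice(k, p=w / w.sum())            # choose a guiding archive solution
    new = np.empty(2)
    for d in range(2):                          # per-dimension Gaussian sampling
        sigma = xi * np.abs(archive[:, d] - archive[j, d]).sum() / (k - 1)
        new[d] = rng.normal(archive[j, d], sigma)
    return np.clip(new, lower, upper)           # keep C, gamma in valid ranges

print(sample_solution(archive))
```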

    Mixed variable ant colony optimization technique for feature subset selection and model selection

    This paper presents the integration of Mixed Variable Ant Colony Optimization and Support Vector Machine (SVM) to enhance the performance of SVM by simultaneously tuning its parameters and selecting a small number of features. Selecting a suitable feature subset and optimizing the SVM parameters must occur simultaneously, because these processes affect each other and, in turn, the SVM performance; treating them separately produces unacceptable classification accuracy. Five datasets from UCI were used to evaluate the proposed algorithm. Results showed that the proposed algorithm can enhance the classification accuracy with a small feature subset.

    Solving Support Vector Machine Model Selection Problem Using Continuous Ant Colony Optimization

    Ant Colony Optimization (ACO) has been used to solve the Support Vector Machine (SVM) model selection problem. ACO originally deals with discrete optimization problems. When applying ACO to optimize SVM parameters, which are continuous variables, the continuous values must be discretized. This discretization process results in loss of information and hence affects the classification accuracy and search time. This study proposes an algorithm that optimizes the SVM parameters using Continuous Ant Colony Optimization without the need to discretize the continuous SVM parameter values. Eight datasets from UCI were used to evaluate the credibility of the proposed hybrid algorithm in terms of classification accuracy and feature subset size. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM.

    Feature selection and model selection algorithm using incremental mixed variable ant colony optimization for support vector machine classifier

    Support Vector Machine (SVM) is a modern classification approach that originated from statistical learning theory. Two main problems influence the performance of SVM: feature subset selection and SVM model selection. To enhance SVM performance, these problems must be solved simultaneously, because errors produced in the feature subset selection phase affect the values of the SVM parameters and result in low classification accuracy. Most approaches to solving the SVM model selection problem discretize the continuous values of the SVM parameters, which degrades performance. Incremental Mixed Variable Ant Colony Optimization (IACOMV) can solve the SVM model selection problem without discretizing the continuous values and can address the two problems simultaneously. This paper presents an algorithm that integrates IACOMV and SVM. Ten datasets from UCI were used to evaluate the performance of the proposed algorithm. Results showed that the proposed algorithm can enhance the classification accuracy with a small number of features.
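
    A rough sketch of how a mixed-variable ant might propose one candidate follows: the continuous SVM parameters are drawn from Gaussians around an archive solution (as in ACOR), while each binary feature bit is drawn from the weighted frequency of that bit among archive solutions. This is a simplified stand-in for the mixed-variable sampling rule; the archive contents, weights, spread factor, and feature count are illustrative assumptions.

```python
# Simplified sketch of mixed-variable sampling: Gaussian for (C, gamma),
# weighted archive frequencies for the binary feature mask. All values assumed.
import numpy as np

rng = np.random.default_rng(1)
n_features, k = 8, 5
archive_params = rng.uniform([0.1, 1e-4], [100.0, 1.0], size=(k, 2))
archive_masks = rng.integers(0, 2, size=(k, n_features))
weights = np.array([0.4, 0.25, 0.15, 0.12, 0.08])        # best solutions weighted most

def propose():
    j = rng.choice(k, p=weights)
    # Continuous part: Gaussian around the chosen archive solution.
    sigma = 0.85 * np.abs(archive_params - archive_params[j]).mean(axis=0)
    params = np.clip(rng.normal(archive_params[j], sigma), [0.1, 1e-4], [100.0, 1.0])
    # Discrete part: include feature i with its weighted frequency in the archive.
    p_include = (weights[:, None] * archive_masks).sum(axis=0)
    mask = (rng.random(n_features) < p_include).astype(int)
    return params, mask

print(propose())
```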

    Optimizing support vector machine parameters using continuous ant colony optimization

    Support Vector Machines are considered excellent pattern classification techniques. Classifying a pattern with high accuracy depends mainly on tuning the Support Vector Machine parameters, namely the generalization error parameter and the kernel function parameter. Tuning these parameters is a complex process and is often done experimentally, relying on time-consuming human expertise. To overcome this difficulty, an approach such as Ant Colony Optimization can tune the Support Vector Machine parameters. Ant Colony Optimization originally deals with discrete optimization problems. Hence, when applying Ant Colony Optimization to optimize Support Vector Machine parameters, which are continuous, the continuous values must be discretized. This discretization process results in loss of information and, hence, affects the classification accuracy and search time. This study proposes an algorithm that optimizes the Support Vector Machine parameters using continuous Ant Colony Optimization without the need to discretize the continuous parameter values. Seven datasets from UCI were used to evaluate the performance of the proposed hybrid algorithm. The proposed algorithm demonstrates its credibility in terms of classification accuracy when compared to grid search techniques. Experimental results also show promising performance in terms of computational speed.
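
    For contrast, the grid-search baseline the abstract compares against works over a fixed discretized set of C and gamma values. A minimal scikit-learn version is sketched below; the exponential grid and 5-fold cross-validation are common defaults, not the paper's reported settings.

```python
# Grid-search baseline over discretized C and gamma values (the approach the
# continuous ACO variants aim to improve on). Grid and CV folds are assumed.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
param_grid = {"C": np.logspace(-2, 3, 6), "gamma": np.logspace(-4, 1, 6)}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```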

    Formulating new enhanced pattern classification algorithms based on ACO-SVM

    This paper presents two algorithms that integrate new Ant Colony Optimization (ACO) variants, Incremental Continuous Ant Colony Optimization (IACOR) and Incremental Mixed Variable Ant Colony Optimization (IACOMV), with the Support Vector Machine (SVM) to enhance its performance. The first algorithm aims to solve the SVM model selection problem. ACO originally deals with discrete optimization problems; applying ACO to the SVM model selection problem, which involves continuous variables, requires discretizing the continuous values. This discretization process results in loss of information and hence affects the classification accuracy and search time. The first algorithm therefore solves the SVM model selection problem using IACOR without the need to discretize the continuous SVM parameter values. The second algorithm aims to simultaneously solve the SVM model selection problem and select a small number of features. SVM model selection and the selection of a suitable, small feature subset must occur simultaneously, because errors produced in the feature subset selection phase affect the SVM parameter values and result in low classification accuracy. The second algorithm therefore uses IACOMV to simultaneously solve the SVM model selection problem and perform feature subset selection. Ten benchmark datasets were used to evaluate the proposed algorithms. Results showed that the proposed algorithms can enhance the classification accuracy with a small feature subset.

    Cancellable face template algorithm based on speeded-up robust features and winner-takes-all

    Features such as the face, fingerprint, and iris have been used for authentication in biometric systems. The most challenging of these is the face. A common method is to extract the region with the most potential face features from an image for biometric identification, followed by illumination enhancement. However, region-of-interest extraction followed by illumination enhancement is sensitive to face feature displacement, skewed images, and poor illumination. This research presents a cancellable face template algorithm built upon the speeded-up robust features (SURF) method to extract and select features. The SURF approach is used for feature extraction from the image, while Winner-Takes-All hashing is used for match-seeking. Finally, the feature vectors are projected using a random binary orthogonal matrix. Experiments were conducted on the Yale and ORL datasets, which provide grayscale images of sizes 168 × 192 and 112 × 92 pixels, respectively. The performance of the proposed algorithm was measured against several algorithms using the equal error rate metric. The proposed algorithm produced acceptable performance, which indicates that it can be used in biometric security applications.
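
    The match-seeking step relies on Winner-Takes-All hashing: each hash code records which element "wins" (is largest) within a fixed random subset of the descriptor, so codes depend only on the ordering of values and similar descriptors share many codes. The sketch below illustrates this on a random stand-in for a 64-D SURF descriptor; the window size and code count are illustrative assumptions.

```python
# Minimal sketch of Winner-Takes-All (WTA) hashing for descriptor matching.
# The 64-D vector is a random stand-in for a SURF descriptor; sizes are assumed.
import numpy as np

def make_wta_hasher(dim, n_codes=16, window=4, seed=2):
    rng = np.random.default_rng(seed)
    # The random windows are fixed once so both descriptors are hashed identically.
    windows = [rng.permutation(dim)[:window] for _ in range(n_codes)]
    def hasher(descriptor):
        return np.array([int(np.argmax(descriptor[idx])) for idx in windows])
    return hasher

hasher = make_wta_hasher(dim=64)
a, b = np.random.default_rng(3).random((2, 64))
# Fraction of matching codes acts as a cheap similarity score between descriptors.
print(hasher(a), (hasher(a) == hasher(b)).mean())
```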

    Incremental continuous ant colony optimization technique for support vector machine model selection problem

    Ant Colony Optimization (ACO) has been used to solve the Support Vector Machine (SVM) model selection problem. ACO originally deals with discrete optimization problems. When applying ACO to optimize SVM parameters, which are continuous variables, the continuous values must be discretized. This discretization process results in loss of information and hence affects the classification accuracy and search time. This study proposes an algorithm that optimizes the SVM parameters using Incremental Continuous Ant Colony Optimization without the need to discretize the continuous SVM parameter values. Seven datasets from UCI were used to evaluate the credibility of the proposed hybrid algorithm in terms of classification accuracy. Promising results were obtained when compared to the grid search technique.
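
    The "incremental" aspect distinguishing IACOR from plain ACOR is that the solution archive grows over the run rather than staying fixed, with new entries placed near the best-so-far solution to intensify the search. The sketch below illustrates only that growth step; the growth schedule, spread, and parameter bounds are assumptions for illustration.

```python
# Sketch of the incremental archive growth in an IACOR-style search over (C, gamma).
# Growth schedule, spread, and bounds are illustrative assumptions, not paper values.
import numpy as np

rng = np.random.default_rng(3)
lower, upper = np.array([0.1, 1e-4]), np.array([100.0, 1.0])   # bounds on (C, gamma)

archive = [rng.uniform(lower, upper) for _ in range(3)]          # small initial archive

def grow_archive(archive, best, spread=0.1):
    """Add one new solution biased toward the current best (C, gamma)."""
    new = np.clip(best + rng.normal(0.0, spread * (upper - lower)), lower, upper)
    archive.append(new)
    return archive

best = archive[0]                          # stand-in for the best-so-far solution
for it in range(1, 10):
    if it % 3 == 0:                        # growth step every few iterations
        archive = grow_archive(archive, best)
print(len(archive), archive[-1])
```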

    Integrated ACOR/IACOMV-R-SVM Algorithm

    A direction for ACO is to optimize continuous and mixed (discrete and continuous) variables when solving problems with various types of data. The Support Vector Machine (SVM), which originates from statistical learning theory, is a modern classification technique. The main problems of SVM are selecting the feature subset and tuning the parameters. Discretizing the continuous values of the parameters is the most common approach to tuning SVM parameters; this process results in loss of information, which affects the classification accuracy. This paper presents two algorithms for tuning SVM parameters and selecting the feature subset. The first algorithm, ACOR-SVM, tunes the SVM parameters, while the second, IACOMV-R-SVM, simultaneously tunes the SVM parameters and selects the feature subset. Three benchmark UCI datasets were used in the experiments to validate the performance of the proposed algorithms. The results show that the proposed algorithms perform well compared to other approaches.